On the Best Rank-1 Approximation of Higher-Order Supersymmetric Tensors

Authors

  • Eleftherios Kofidis
  • Phillip A. Regalia
Abstract

Recently, the problem of determining the best, in the least-squares sense, rank-1 approximation to a higher-order tensor was studied, and an iterative method that extends the well-known power method for matrices was proposed for its solution. This higher-order power method was also proposed, unchanged, for the special but important class of supersymmetric tensors. A simplified version, adapted to the special structure of the supersymmetric problem, was deemed unreliable, since its convergence is not guaranteed. The aim of this paper is to show that a symmetric version of the above method converges under assumptions of convexity (or concavity) of the functional induced by the tensor in question, assumptions that are very often satisfied in practical applications. Using this version entails significant savings in computational complexity compared with the unconstrained higher-order power method. Furthermore, a novel method for initializing the iterative process is developed which has been observed to yield an estimate lying closer to the global optimum than the previously suggested initialization; moreover, its proximity to the global optimum is quantifiable a priori. In the course of the analysis, some important properties that the supersymmetry of a tensor implies for its square matrix unfolding are also studied.
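The symmetric iteration described above repeatedly contracts the tensor with the current unit vector along all modes but one and renormalizes the result. As a rough illustration only (the function name, stopping rule, and order-3 restriction below are assumptions, not the paper's exact formulation or guarantees), a numpy sketch might look as follows:

    import numpy as np

    def symmetric_hopm(A, x0=None, max_iter=200, tol=1e-10):
        # Symmetric higher-order power iteration for an order-3
        # supersymmetric tensor A (A[i,j,k] invariant under index
        # permutations).  Illustrative sketch only; the convergence
        # result discussed in the paper relies on convexity (or
        # concavity) of the functional induced by A.
        n = A.shape[0]
        x = np.random.randn(n) if x0 is None else np.asarray(x0, dtype=float)
        x /= np.linalg.norm(x)
        lam = 0.0
        for _ in range(max_iter):
            # Contract A with x along all modes but one:
            # y[i] = sum_{j,k} A[i, j, k] * x[j] * x[k]
            y = np.einsum('ijk,j,k->i', A, x, x)
            lam_new = float(x @ y)        # value of the induced functional
            x = y / np.linalg.norm(y)     # normalized fixed-point update
            if abs(lam_new - lam) < tol:
                break
            lam = lam_new
        # Best rank-1 estimate: lam * (x outer x outer x)
        return lam, x

A supersymmetric test tensor can be obtained by averaging a random tensor over all permutations of its indices before calling the routine above.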


Similar articles

On the successive supersymmetric rank-1 decomposition of higher-order supersymmetric tensors

In this paper, a successive supersymmetric rank-1 decomposition of a real higher-order supersymmetric tensor is considered. To obtain such a decomposition, we design a greedy method based on iteratively computing the best supersymmetric rank-1 approximation of the residual tensors. We further show that a supersymmetric canonical decomposition could be obtained when the method is applied to an o...
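In the same spirit as the sketch above (again an illustration under assumptions, reusing the hypothetical symmetric_hopm routine rather than this paper's implementation), the greedy idea of fitting and subtracting rank-1 terms could be written as:

    import numpy as np

    def successive_rank1(A, R, **hopm_kwargs):
        # Greedy successive rank-1 decomposition (illustrative sketch):
        # fit a best supersymmetric rank-1 term to the current residual,
        # subtract it, and repeat R times.  As the last two entries in
        # this list note, such deflation is not guaranteed to decrease
        # tensor rank.
        residual = A.copy()
        terms = []
        for _ in range(R):
            lam, x = symmetric_hopm(residual, **hopm_kwargs)
            rank1 = lam * np.einsum('i,j,k->ijk', x, x, x)
            terms.append((lam, x))
            residual = residual - rank1
        return terms, residual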


Best Rank-One Tensor Approximation and Parallel Update Algorithm for CPD

A novel algorithm is proposed for CANDECOMP/PARAFAC (CP) tensor decomposition that exploits best rank-1 tensor approximation. Unlike existing algorithms, it updates the rank-1 tensors simultaneously, in parallel. To achieve this, we develop new all-at-once algorithms for best rank-1 tensor approximation based on the Levenberg-Marquardt method and on a rotational update. We sho...


Orthogonal Rank-two Tensor Approximation: a Modified High-order Power Method and Its Convergence Analysis

With the notable exceptions that tensors of order 2 (that is, matrices) always have best approximations of arbitrary low rank, and that tensors of any order always have a best rank-one approximation, it is known that higher-order tensors can fail to have best low-rank approximations. When the condition of orthogonality is imposed, even in the most general case where only one pair of components in...


Subtracting a best rank-1 approximation may increase tensor rank

It has been shown that a best rank-R approximation of an order-k tensor may not exist when R ≥ 2 and k ≥ 3. This poses a serious problem to data analysts using tensor decompositions. It has been observed numerically that, generally, this issue cannot be solved by consecutively computing and subtracting best rank-1 approximations. The reason for this is that subtracting a best rank-1 approximati...


Subtracting a best rank-1 approximation does not necessarily decrease tensor rank

It has been shown that a best rank-R approximation of an order-k tensor may not exist when R ≥ 2 and k ≥ 3. This poses a serious problem to data analysts using tensor decompositions. It has been observed numerically that, generally, this issue cannot be solved by consecutively computing and subtracting best rank-1 approximations. The reason for this is that subtracting a best rank-1 approximati...



Journal:
  • SIAM J. Matrix Analysis Applications

Volume 23

Pages: -

Publication year: 2002